Search results for "data integrity"

Showing 10 of 13 documents

Use of Information from Clinical Trials for an Integrated Cancer Registry

1990

The registry of childhood malignancies in the F.R.G. combines a population-based and a hospital-based cancer registry. A large share of the collected data originates from multicenter clinical trials that are integrated into the registry's documentation system. The paper describes the information flow and the data storage system, which consists of a central database on a departmental system at the registry and several coordinated peripheral databases on microcomputers at the trial centers. Practical experience shows increased availability and validity of the registry data since the system was implemented. Aspects of data integrity and secur…

Keywords: Advanced and Specialized Nursing, Population, Health Informatics, Cancer registry, Clinical trial, Documentation, Health Information Management, Data integrity, Medicine, Information flow (information theory), Medical emergency, Central database, Information exchange
Published in: Methods of Information in Medicine

Attacks Against the WAP WTLS Protocol

1999

The WAP WTLS protocol was designed to provide privacy, data integrity, and authentication for wireless terminals. The protocol is currently being fielded and is expected to be deployed in millions of devices within a few years.
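
The data-integrity guarantee in WTLS-class protocols rests on a keyed MAC computed over each record. A minimal sketch in Python, not the paper's attack code: the key, sequence-number binding, and HMAC-SHA-1 choice here are illustrative assumptions about how such record protection works in general.

```python
import hmac
import hashlib

def mac_record(key: bytes, seq_num: int, payload: bytes) -> bytes:
    """Compute a keyed MAC over a record, binding it to its sequence number
    so a captured record cannot be replayed at another position."""
    msg = seq_num.to_bytes(8, "big") + payload
    return hmac.new(key, msg, hashlib.sha1).digest()

def verify_record(key: bytes, seq_num: int, payload: bytes, tag: bytes) -> bool:
    """Constant-time comparison guards against timing attacks on the tag."""
    return hmac.compare_digest(mac_record(key, seq_num, payload), tag)

key = b"illustrative-session-key"   # hypothetical session key
tag = mac_record(key, 1, b"hello")
assert verify_record(key, 1, b"hello", tag)       # intact record accepted
assert not verify_record(key, 2, b"hello", tag)   # wrong sequence rejected
assert not verify_record(key, 1, b"hellO", tag)   # tampered payload rejected
```

Binding the sequence number into the MAC is what lets a receiver reject replayed or reordered records, not just modified ones.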

Keywords: Authentication, Wireless Transport Layer Security, Transport Layer Security, Computer science, Data integrity, Wireless, Protocol (object-oriented programming), Stream cipher, Block cipher, Computer network

The Reprocessed Proba-V Collection 2: Product Validation

2021

With the objective of improving data quality in terms of cloud detection, absolute radiometric calibration, and atmospheric correction, the PRoject for On-Board Autonomy-Vegetation (PROBA-V) data archive (October 2013 - June 2020) will be reprocessed to Collection 2 (C2). The product validation is organized in three phases and focuses on the intercomparison with PROBA-V Collection 1 (C1); consistency analyses with SPOT-VGT, Sentinel-3 SYN-VGT, Terra-MODIS, and METOP-AVHRR are also foreseen. First preliminary results show improved performance of cloud and snow/ice masking, and indicate that the statistical consistency between PROBA-V C2 and C1 is in line with expectations. PROBA-V C2 data are …

Keywords: Consistency (statistics), Calibration (statistics), Data integrity, Data quality, Atmospheric correction, Environmental science, Cloud computing, Snow, Radiometric calibration, Remote sensing
Published in: 2021 IEEE International Geoscience and Remote Sensing Symposium (IGARSS)

Spatio-temporal Schema Integration with Validation: A Practical Approach

2005

We propose to enhance a schema integration process with a validation phase employing logic-based data models. In our methodology, we validate the source schemas against the data model; the inter-schema mappings are validated against the semantics of the data model and the syntax of the correspondence language. In this paper, we focus on how to employ a reasoning engine to validate spatio-temporal schemas and describe where the reasoning engine is plugged into our integration methodology. The validation phase distinguishes our integration methodology from other approaches. We shift the emphasis on automation from the a priori discovery to the a posteriori checking of the inter-schema mapping…
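
The a-posteriori checking step can be pictured with a toy validation pass: each proposed inter-schema correspondence must pair attributes that exist and carry compatible types in the data model. The paper uses logic-based models and a reasoning engine; the dict-based schemas, attribute names, and types below are invented for illustration only.

```python
# Toy a-posteriori check of inter-schema attribute mappings. Schemas are
# simplified to {attribute: type}; the paper's formalism is far richer.
def validate_mappings(source, target, mappings):
    """Return a list of violations for the proposed (src, tgt) pairs."""
    errors = []
    for src_attr, tgt_attr in mappings:
        if src_attr not in source:
            errors.append(f"unknown source attribute: {src_attr}")
        elif tgt_attr not in target:
            errors.append(f"unknown target attribute: {tgt_attr}")
        elif source[src_attr] != target[tgt_attr]:
            errors.append(f"type mismatch: {src_attr}:{source[src_attr]} "
                          f"vs {tgt_attr}:{target[tgt_attr]}")
    return errors

# Hypothetical spatio-temporal schemas: a geometry and a date attribute each.
hotel = {"name": "string", "built": "date", "location": "geometry"}
resort = {"title": "string", "opened": "date", "area": "geometry"}
assert validate_mappings(hotel, resort, [("name", "title"), ("built", "opened")]) == []
assert validate_mappings(hotel, resort, [("built", "area")]) != []
```

The point mirrors the abstract: mappings are first proposed (possibly automatically), then checked against the model rather than trusted as discovered.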

Keywords: Data model, Description logic, Computer science, Data integrity, Schema (psychology), Semantic reasoner, Data mining, Logic model, Data modeling

On Using Conceptual Modeling for Ontologies

2004

Are database concepts and techniques suitable for ontology design and management? The question has been on the floor for some time already. It has gained new emphasis today, thanks to the focus on ontologies and ontology services driven by the spread of web services as a new paradigm for information management. This paper analyzes some of the arguments relevant to the debate, in particular whether conceptual data models would adequately support the design and use of ontologies. It concludes by suggesting a hybrid approach, combining databases and logic-based services.

Keywords: Information management, Computer science, NCCR-MICS, NCCR-MICS/CL4, Process ontology, Ontology (information science), Data modeling, World Wide Web, Description logic, Data integrity, Conceptual model, Ontology, Information system, Web service

Trust-enhanced intelligent security model

2012

In this paper we propose a trust enhancement of access control that protects both integrity and confidentiality, based on the trustworthiness of the users performing operations and on analysis of documents' content. We propose to utilize trustworthiness opinions from subjective logic and to express levels of integrity as levels of trustworthiness. We assign confidentiality levels based on the contents of documents and use opinions to express the trustworthiness of such assignments.
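
A subjective-logic opinion is a tuple of belief, disbelief, uncertainty, and a base rate, with b + d + u = 1 and projected probability E = b + a·u. A minimal sketch of how such opinions could gate access, assuming a simple threshold policy (the thresholds and the `may_access` rule are illustrative, not the paper's model):

```python
from dataclasses import dataclass

@dataclass
class Opinion:
    """A subjective-logic opinion (b, d, u, a) with b + d + u = 1."""
    belief: float
    disbelief: float
    uncertainty: float
    base_rate: float = 0.5

    def __post_init__(self):
        assert abs(self.belief + self.disbelief + self.uncertainty - 1.0) < 1e-9

    def expected(self) -> float:
        # Projected probability E = b + a * u (standard in subjective logic).
        return self.belief + self.base_rate * self.uncertainty

def may_access(user_opinion: Opinion, required_level: float) -> bool:
    """Grant access when expected trustworthiness clears the document's
    confidentiality level; a simplification for illustration."""
    return user_opinion.expected() >= required_level

trusted = Opinion(belief=0.8, disbelief=0.1, uncertainty=0.1)
assert may_access(trusted, required_level=0.7)
assert not may_access(Opinion(0.2, 0.6, 0.2), required_level=0.7)
```

Expressing integrity levels as trustworthiness levels, as the abstract proposes, amounts to plugging `expected()` in wherever a discrete security label would otherwise appear.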

Keywords: Information privacy, Computer science, Internet privacy, Access control, Computer security model, Computer security, Content analysis, Software agent, Data integrity, Confidentiality, Subjective logic
Published in: 2012 6th IEEE International Conference Intelligent Systems

An Approach to Data Quality Evaluation

2018

This research proposes a new approach to data quality evaluation comprising three aspects: (1) definition of the data object whose quality will be analyzed, (2) specification of quality requirements for the data object using a Domain Specific Language (DSL), and (3) implementation of an executable data quality model that enables scanning of the data object and detection of its shortages. As in Model Driven Architecture (MDA), data quality modelling is divided into platform-independent (PIM) and platform-specific (PSM) models. The PIM comprises informal specifications of data quality; the PSM describes the implementation of the data quality model, thus making it executable. The approbation of the proposed…
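
The PIM/PSM split can be pictured as declared quality requirements compiled into an executable scan. A toy sketch in Python: the rules play the PIM role (named requirements) and `scan` the PSM role (their executable form). The field names and rules are invented for illustration; the paper's DSL is far richer.

```python
# Named quality requirements over a record: the "platform independent" part.
rules = {
    "id is present":    lambda r: r.get("id") is not None,
    "age is plausible": lambda r: isinstance(r.get("age"), int) and 0 <= r["age"] <= 120,
    "email has @":      lambda r: "@" in r.get("email", ""),
}

def scan(records):
    """Executable quality model: return {record index: [violated rules]}
    for every record with shortages (the "platform specific" part)."""
    report = {}
    for i, rec in enumerate(records):
        violated = [name for name, check in rules.items() if not check(rec)]
        if violated:
            report[i] = violated
    return report

data = [
    {"id": 1, "age": 34, "email": "a@example.org"},
    {"id": None, "age": 200, "email": "nope"},
]
assert scan(data) == {1: ["id is present", "age is plausible", "email has @"]}
```

A real PSM would compile the same requirements to, say, SQL over the stored data rather than Python predicates over in-memory records.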

Keywords: SQL, Computer science, Software requirements specification, Linked data, Data modeling, Data integrity, Data quality, Quality (business), Executable, Software engineering
Published in: 2018 Fifth International Conference on Social Networks Analysis, Management and Security (SNAMS)

Efficient and Lightweight Data Integrity Check in In-Networking Storage Wireless Sensor Networks

2009

In in-networking storage wireless sensor networks, sensed data are stored locally for a long term and retrieved on demand rather than in real time. To maximize data survival, the sensed data are normally stored distributively at multiple nearby nodes. This raises the problem of how to check and guarantee the integrity of distributed data storage under resource constraints. In this paper, a technique called Two Granularity Linear Code (TGLC), consisting of Intra-codes and Inter-codes, is presented, and an efficient and lightweight data integrity check scheme based on TGLC is proposed. Data integrity can be checked by anyone who holds the short Inter-codes, and the checking credentials is shor…
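
The flavour of a lightweight check over distributed blocks can be shown with a deliberately simple stand-in: a short XOR parity acts as the small credential a verifier holds to check the stored blocks without keeping a full replica. This is not the paper's TGLC (which uses proper intra-/inter-codes and can localize and tolerate errors); it only illustrates "short credential checks long data".

```python
def xor_parity(blocks):
    """XOR equal-length blocks into one short parity value."""
    parity = bytes(len(blocks[0]))
    for b in blocks:
        parity = bytes(x ^ y for x, y in zip(parity, b))
    return parity

def check(blocks, stored_parity):
    """Lightweight integrity check: recompute parity, compare credential."""
    return xor_parity(blocks) == stored_parity

# Hypothetical data held at three nearby storage nodes (equal-length blocks).
blocks = [b"node-1-data!", b"node-2-data!", b"node-3-data!"]
credential = xor_parity(blocks)          # the short value the checker keeps
assert check(blocks, credential)
tampered = [b"node-1-DATA!", blocks[1], blocks[2]]
assert not check(tampered, credential)
```

A real scheme must also survive a compromised node forging matching blocks, which plain XOR parity does not; that is what the linear-code construction in the paper addresses.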

Keywords: Scheme (programming language), Distributed database, Computer science, Distributed computing, Context (language use), Term (time), Data integrity, Distributed data store, Overhead (computing), Wireless sensor network, Computer network
Published in: 2009 IEEE International Symposium on Parallel and Distributed Processing with Applications

Trust-enhanced data integrity model

2012

In this paper we propose an enhanced data integrity model. The proposed model is based on the idea of the Biba integrity model but uses more elaborate integrity measurements. Since integrity can be seen as "trustworthiness of data and resources", we propose to utilize trustworthiness opinions from subjective logic and to express levels of integrity as levels of trustworthiness.
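
Biba's two rules transfer directly to continuous trust levels: a subject may read only objects at or above its integrity level ("no read down") and write only objects at or below it ("no write up"). A minimal sketch, with the specific level values being illustrative assumptions rather than taken from the paper:

```python
def can_read(subject_level: float, object_level: float) -> bool:
    # "No read down": reading lower-integrity data would contaminate the subject.
    return object_level >= subject_level

def can_write(subject_level: float, object_level: float) -> bool:
    # "No write up": a subject must not raise data above its own integrity.
    return object_level <= subject_level

# A trust-derived level, e.g. the expected value of a subjective-logic
# opinion, plugs in where Biba's discrete labels would be.
user_level = 0.85
assert can_read(user_level, object_level=0.9)
assert not can_read(user_level, object_level=0.4)
assert can_write(user_level, object_level=0.4)
assert not can_write(user_level, object_level=0.9)
```

Replacing discrete labels with trustworthiness values keeps the lattice-style reasoning of Biba while letting levels be measured rather than assigned by fiat.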

Keywords: Trustworthiness, Computer science, Data integrity, Trusted Computing, Subjective logic, Computer security
Published in: 2012 IEEE 1st International Symposium on Wireless Systems (IDAACS-SWS)

Secure, dependable and publicly verifiable distributed data storage in unattended wireless sensor networks

2010

Published version of an article from the journal: Science in China, Series F: Information Sciences. The original publication is available at SpringerLink. http://dx.doi.org/10.1007/s11432-010-0096-7 In unattended wireless sensor networks (UWSNs), sensed data are stored locally or at designated nodes and further accessed by authorized collectors on demand. This paradigm is motivated by scenarios where historical or digest data (e.g., average temperature in a day), rather than real-time data, are of interest. The data are not instantly forwarded to a central sink upon sensing, thereby saving communication energy for transmission. Such a paradigm can also improve data survivability by m…
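
The combination of distributed storage with public verifiability can be pictured with a toy construction: split a datum into XOR shares held by different nodes, and publish per-share hashes so anyone can verify a returned share without learning the datum. The paper's actual scheme (verifiable secret sharing with Byzantine fault tolerance) is substantially stronger; this sketch, including the share count and the hash-as-commitment choice, is only an illustrative assumption.

```python
import hashlib
import secrets

def split(data: bytes, n: int):
    """Split data into n XOR shares plus public per-share commitments."""
    shares = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
    last = data
    for s in shares:
        last = bytes(x ^ y for x, y in zip(last, s))
    shares.append(last)                      # XOR of all n shares == data
    commitments = [hashlib.sha256(s).hexdigest() for s in shares]
    return shares, commitments

def verify_share(share: bytes, commitment: str) -> bool:
    """Anyone holding the public commitment can check a returned share."""
    return hashlib.sha256(share).hexdigest() == commitment

def reconstruct(shares):
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(x ^ y for x, y in zip(out, s))
    return out

shares, commits = split(b"avg temp 21C", n=3)
assert all(verify_share(s, c) for s, c in zip(shares, commits))
assert reconstruct(shares) == b"avg temp 21C"
```

Note this toy version needs all n shares (no threshold) and hash commitments reveal nothing useful only if shares are high-entropy; real verifiable secret sharing removes both caveats.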

Keywords: General Computer Science, Secret sharing, Data integrity, Computer data storage, Distributed data store, Dependability, Verifiable secret sharing, Wireless sensor network, Byzantine fault tolerance, Computer network
Published in: Science China Information Sciences